Course Information
Course Title
數據分析與流形學習
Data Analysis and Manifold Learning
Semester
110-2
Intended Audience
College of Electrical Engineering and Computer Science, Master's Program in Data Science
Instructor
林澤佑
Course Number
Data5008
Course Identifier
946 U0080
Section

Credits
3.0
Full/Half Year
Half year
Required/Elective
Elective
Class Time
Thursday, periods 8-10 (15:30-18:20)
Classroom
新501
Remarks
Students must bring their own laptop for in-class exercises.
Restricted to master's students and above.
Enrollment limit: 30 students
 
Course Introduction Video

Core Competency Mapping
Map of Core Competencies and Curriculum Planning
Course Outline
To protect everyone's rights, please respect intellectual property and do not make illegal photocopies.
Course Description

Week 1: Basic graph theory and matrices associated with a graph
Week 2: Types of special matrices and matrix eigenvalue problems
Week 3: Introduction to spectral and graph-based methods
Week 4: Exam I; PCA and SVD
Week 5: Fisher linear discriminant
Week 6: Multidimensional scaling
Week 7: Locally linear embedding
Week 8: ISOMAP
Week 9: Laplacian embedding and spectral clustering
Week 10: Exam II; introduction to kernel methods
Week 11: Kernel PCA
Week 12: Diffusion kernels
Week 13: Introduction to manifold reconstruction
Week 14: Exam III
Week 15: Final project
Week 16: Final project
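Several of the weekly topics above reduce to compact eigenvalue computations; as an illustration (not part of the course material; the function name `classical_mds` and the demo data are my own), here is a minimal NumPy sketch of classical multidimensional scaling, the Week 6 topic:

```python
import numpy as np

def classical_mds(D, k=2):
    """Embed n points in R^k from an (n, n) matrix D of pairwise Euclidean distances."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered (Gram) matrix
    w, V = np.linalg.eigh(B)                   # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]              # indices of the k largest eigenvalues
    scale = np.sqrt(np.maximum(w[idx], 0.0))   # clip tiny negatives from rounding
    return V[:, idx] * scale                   # (n, k) embedding coordinates

# Demo: distances computed from 3-D points are reproduced exactly by a 3-D embedding.
X = np.random.RandomState(0).randn(8, 3)
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
Y = classical_mds(D, k=3)
```

When `D` contains exact Euclidean distances, the Gram matrix `B` is positive semidefinite and the embedding recovers the original configuration up to rotation and translation.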

Course Objectives
Manifold learning is a branch of machine learning that studies (non)linear dimensionality reduction, which is often applied to data preprocessing in data science. One purpose of dimensionality reduction is to represent data of interest from a high-dimensional space in a low-dimensional space, so that hidden information can be extracted from the data with fewer coordinates.

In this course, we will briefly introduce several well-known manifold learning techniques and apply them using Python. Each topic will be covered in both theoretical and practical parts.
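The dimensionality-reduction idea described above can be sketched in a few lines of NumPy. This is an illustrative sketch only, not official course code; the helper `pca` is hypothetical. It projects centered data onto its top-k principal directions via the SVD, the Week 4 topic:

```python
import numpy as np

def pca(X, k):
    """Project the rows of X onto their top-k principal components."""
    Xc = X - X.mean(axis=0)                    # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                       # (n, k) low-dimensional coordinates

# Demo: rank-2 data hidden in a 5-dimensional ambient space.
rng = np.random.RandomState(1)
X = rng.randn(20, 2) @ rng.randn(2, 5)
Y = pca(X, 2)                                  # two coordinates carry all the variance
```

Because the demo data has rank 2, the two retained coordinates preserve the full variance of the centered data, which is exactly the "fewer coordinates, same information" situation the paragraph above describes.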
Course Requirements
Prerequisites: Calculus, Probability, Linear Algebra, Elementary Geometry, Python
Expected Weekly Study Hours Outside Class

Office Hours

Assigned Reading
To be announced
References
Lectures will be based on the slides.

Reference:
[1] Belkin, Mikhail, and Partha Niyogi. "Laplacian eigenmaps for dimensionality reduction and data representation." Neural computation 15.6 (2003): 1373-1396.
[2] Borg, Ingwer, and Patrick JF Groenen. Modern multidimensional scaling: Theory and applications. Springer Science & Business Media, 2005.
[3] Boumal, Nicolas. "An introduction to optimization on smooth manifolds." Available online, Aug (2020).
[4] De la Porte, J., et al. "An introduction to diffusion maps." Proceedings of the 19th Symposium of the Pattern Recognition Association of South Africa (PRASA 2008), Cape Town, South Africa. 2008.
[5] Fefferman, Charles, et al. "Reconstruction and interpolation of manifolds. I: The geometric Whitney problem." Foundations of Computational Mathematics 20.5 (2020): 1035-1133.
[6] Jolliffe, Ian. "Principal component analysis." Encyclopedia of statistics in behavioral science (2005).
[7] Ma, Yunqian, and Yun Fu. Manifold learning theory and applications. Vol. 434. Boca Raton, FL: CRC press, 2012.
[8] Saul, Lawrence K., et al. "Spectral methods for dimensionality reduction." Semi-supervised learning 3 (2006).
[9] Saul, Lawrence K., and Sam T. Roweis. "An introduction to locally linear embedding." Unpublished manuscript (2000). Available at: http://www.cs.toronto.edu/~roweis/lle/publications.html
[10] Tenenbaum, Joshua B., Vin De Silva, and John C. Langford. "A global geometric framework for nonlinear dimensionality reduction." Science 290.5500 (2000): 2319-2323.
[11] Van Der Maaten, Laurens, Eric Postma, and Jaap Van den Herik. "Dimensionality reduction: a comparative review." J Mach Learn Res 10.66-71 (2009): 13.
Grading
(For reference only)

No.  Item      Percentage
1.   Homework  35%
2.   Exam I    10%
3.   Exam II   10%
4.   Exam III  15%
5.   Project   30%
 
Course Schedule
Week
Date
Topic